# STEM domain expert
## Virtuoso Medium V2
- Publisher: arcee-ai
- License: Apache-2.0
- Type: Large Language Model
- Library: Transformers
- Description: A 32-billion-parameter language model built on the Qwen-2.5-32B architecture and trained via DeepSeek-V3 distillation, demonstrating excellent performance across multiple benchmarks.
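The model is distributed in the standard Transformers format, so it can be loaded with the usual Auto classes. A minimal loading sketch, assuming the Hugging Face repo id is `arcee-ai/Virtuoso-Medium-v2` (the exact id is not given in this listing):

```python
# Minimal loading sketch for Virtuoso Medium V2 with the Hugging Face Transformers library.
# The repo id below is an assumption; check the arcee-ai organization page for the exact name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "arcee-ai/Virtuoso-Medium-v2"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # 32B parameters; half precision keeps memory manageable
    device_map="auto",           # spread layers across available GPUs
)

prompt = "Explain the difference between covalent and ionic bonds."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```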
## Einstein V6.1 Llama3 8B
- Publisher: Weyaxi
- License: Other
- Type: Large Language Model
- Library: Transformers
- Language: English
- Description: A large language model based on Meta-Llama-3-8B and fine-tuned on diverse scientific datasets, specializing in STEM-domain tasks.
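Since this model targets STEM question answering, a chat-style inference sketch is the more natural usage pattern. The repo id `Weyaxi/Einstein-v6.1-Llama3-8B` and the presence of a chat template in the tokenizer are assumptions, not stated in this listing:

```python
# Chat-style inference sketch; repo id and chat-template availability are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "Weyaxi/Einstein-v6.1-Llama3-8B"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [{"role": "user", "content": "State the ideal gas law and define each symbol."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=200)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```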